ARROW: Automated Repair of Races on Client-Side Web Pages

Authors

  • Weihang Wang
  • Yunhui Zheng
  • Peng Liu
  • Lei Xu
  • Xiangyu Zhang
  • Patrick Eugster
Abstract

Modern browsers have a highly concurrent page rendering process in order to be more responsive. However, such a concurrent execution model leads to various race issues. In this paper, we present ARROW, a static technique that can automatically, safely, and cost-effectively patch certain race issues on client-side pages. It works by statically modeling a web page as a causal graph denoting happens-before relations between page elements, according to the rendering process in browsers. Races are detected by identifying inconsistencies between the graph and the dependence relations intended by the developer. Detected races are fixed by leveraging a constraint solver to add a set of edges with the minimum cost to the causal graph so that it is consistent with the intended dependences. The input page is then transformed to respect the repair edges. ARROW has fixed 151 races from 20 real-world commercial web sites.
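
To make the approach in the abstract concrete, the following sketch (in TypeScript, with hypothetical node names and a toy page) illustrates the core idea only: page actions become nodes in a happens-before graph, an intended dependence with no happens-before path is flagged as a race, and a repair adds the missing ordering edge. This is an illustration of the model, not ARROW's actual detection or its solver-based minimal-cost repair.

// A minimal sketch (not ARROW's implementation) of happens-before modeling
// and race detection for a client-side page. Node names and the example
// page are hypothetical.

type NodeId = string;

class HappensBeforeGraph {
  private edges = new Map<NodeId, Set<NodeId>>();

  addEdge(from: NodeId, to: NodeId): void {
    if (!this.edges.has(from)) this.edges.set(from, new Set());
    this.edges.get(from)!.add(to);
  }

  // True if `from` is guaranteed to happen before `to` (a path exists).
  happensBefore(from: NodeId, to: NodeId): boolean {
    const seen = new Set<NodeId>([from]);
    const stack = [from];
    while (stack.length > 0) {
      const n = stack.pop()!;
      if (n === to) return true;
      for (const m of this.edges.get(n) ?? []) {
        if (!seen.has(m)) { seen.add(m); stack.push(m); }
      }
    }
    return false;
  }
}

// Happens-before edges derived from how browsers render a page:
// HTML is parsed top to bottom, and an async script may run at any point
// after its tag is parsed, so nothing forces it to wait for elements
// parsed later.
const hb = new HappensBeforeGraph();
hb.addEdge("parse:<script async src=init.js>", "exec:init.js");
hb.addEdge("parse:<script async src=init.js>", "parse:<div id=menu>");

// Intended dependence: init.js calls document.getElementById("menu"),
// so the developer expects the div to be parsed before the script runs.
const intended: Array<[NodeId, NodeId]> = [
  ["parse:<div id=menu>", "exec:init.js"],
];

// A race is an intended dependence the graph does not enforce.
const races = intended.filter(([a, b]) => !hb.happensBefore(a, b));
console.log(races); // [["parse:<div id=menu>", "exec:init.js"]]

// A repair adds the missing edge, e.g. by deferring init.js or moving its
// tag after the div; ARROW chooses such repair edges with a constraint
// solver so that their total cost is minimal.
hb.addEdge("parse:<div id=menu>", "exec:init.js");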

Similar resources

Optimizing the Execution and Response of Web Pages in the Cloud Using Preprocessing Methods: A Case Study of the Varnish and Nginx Systems

The response speed of web pages is one of the necessities of information technology. In recent years, renowned companies such as Google and computer scientists have focused on speeding up the web. Achievements such as Google PageSpeed, Nginx and Varnish are the result of this research. In Customer to Customer (C2C) business systems, such as chat systems, and in Business to Customer (B2C) systems, s...

Hybrid Analysis for JavaScript Security Assessment

With the proliferation of Web 2.0 technologies, functionality in web applications is increasingly moving from server-side to client-side code, primarily JavaScript. The dynamic and event-driven nature of JavaScript code, which is often machine generated or obfuscated, combined with reliance on complex frameworks and asynchronous communication, makes it difficult to perform effective security aud...

Crawling Web Pages with Support for Client-Side Dynamism

There is a great amount of information on the web that cannot be accessed by conventional crawler engines. This portion of the web is usually known as the Hidden Web. To be able to deal with this problem, it is necessary to solve two tasks: crawling the client-side and crawling the server-side hidden web. In this paper we present an architecture and a set of related techniques for accessing th...

A Client Side Approach to Building the Semantic Web

In this paper, I describe an alternative approach to building a semantic web that addresses some known challenges to existing attempts. In particular, powerful information extraction techniques are used to identify concepts of interest in Web pages. Identified concepts are then used to semi-automatically construct assertions in a computer-readable markup, reducing manual annotation requirements....

Client Honeypot Based Malware Program Detection Embedded Into Web Pages

In today’s world, where the internet hosts major resources, malware programs embedded into web pages have become a severe threat, launching client-side attacks by exploiting browser-based vulnerabilities. With the improvement of software security, attacks based on such vulnerabilities have declined, whereas attacks based on client-side applications ...


Publication date: 2016